Fog Computing vs Edge Computing: Comparison of Two New Computing Models

September 07, 2021

Computing has come a long way since the early days of punch cards and vacuum tubes. From mainframes to personal computers to smartphones, computing has continually evolved to meet the needs of society. The newest iterations in the computing world are the Fog and Edge computing models.

So, what exactly are these new computing models? How are they different from each other, and which is better? In this article, we will compare Fog computing and Edge computing to answer these questions.

Fog Computing

Fog computing is a distributed computing model that extends the capabilities of cloud computing to the edge of the network. It was introduced by Cisco in 2012 to address the limitations of cloud computing in some applications.

Fog computing distributes computation, data storage, and applications along the cloud-to-things continuum, placing resources at intermediate nodes (such as gateways and routers) between the cloud and end devices.

Fog computing has the following benefits:

  1. Reduced Latency: Fog computing reduces network latency by bringing computation, storage, and data closer to the edge of the network, so less data has to travel back and forth to the cloud.
  2. Scalability: Fog computing enables the distribution of computation and storage resources, making it easier to scale systems than traditional centralized models.
  3. Improved Security: Fog computing can help reduce security threats by allowing organizations to process sensitive data on devices near the network's edge rather than transmitting it all to the cloud, shrinking the exposure of that data in transit.
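The benefits above can be sketched with a toy model. In this illustration (all class and method names are hypothetical, chosen for this example only), a fog node sits between edge sensors and the cloud: it performs computation locally and forwards only a compact summary upstream, which is the latency and bandwidth argument in miniature.

```python
class Cloud:
    """Stands in for a remote cloud backend; merely records what it receives."""
    def __init__(self):
        self.received = []

    def upload(self, payload):
        self.received.append(payload)


class FogNode:
    """Aggregates raw sensor readings locally and uploads only a summary."""
    def __init__(self, cloud, batch_size=5):
        self.cloud = cloud
        self.batch_size = batch_size
        self.buffer = []

    def ingest(self, reading):
        # Computation happens at the fog layer, close to the data source.
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            summary = {
                "count": len(self.buffer),
                "mean": sum(self.buffer) / len(self.buffer),
            }
            self.cloud.upload(summary)  # one upload instead of batch_size
            self.buffer.clear()


cloud = Cloud()
node = FogNode(cloud, batch_size=5)
for temp in [20.0, 21.0, 19.0, 22.0, 18.0]:
    node.ingest(temp)

print(cloud.received)  # [{'count': 5, 'mean': 20.0}]
```

Five raw readings produce a single small upload, so the wide-area link carries one message instead of five, while the raw data never leaves the local network.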

Edge Computing

Edge computing is a distributed computing model that brings computation and data storage closer to the devices at the edge of the network, reducing latency and bandwidth usage. Its architecture moves computation from the central cloud out to the network edge, as close to the data source as possible.

The following are some benefits of edge computing:

  1. Low Latency: Edge computing reduces latency by performing computing and processing closer to the devices at the network's edge, avoiding the round trip to a distant cloud data center.
  2. Scalability: Edge computing provides a distributed architecture that enables organizations to distribute their computational and storage resources close to where they are needed.
  3. Reduced Bandwidth: Edge computing reduces the need for data to travel long distances across the network, freeing up bandwidth for other types of traffic.
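The bandwidth point can be made concrete with a small sketch. Here, a device filters its own readings locally and transmits only the anomalies; the function name and thresholds are hypothetical, invented for this illustration.

```python
def filter_at_edge(readings, low=10.0, high=30.0):
    """Keep in-range readings on the device; return only anomalies to send."""
    return [r for r in readings if r < low or r > high]


readings = [21.5, 22.0, 45.3, 21.8, 9.1, 22.4]
to_cloud = filter_at_edge(readings)

print(to_cloud)  # [45.3, 9.1]
print(f"{len(to_cloud)} of {len(readings)} readings sent upstream")
```

Only two of the six readings cross the network, freeing the remaining bandwidth for other traffic, which is the essence of the edge computing argument.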

Conclusion

Fog computing and Edge computing are two distributed computing models that bring computation and data storage closer to the devices at the network's edge, reducing latency, improving scalability, and increasing efficiency.

Edge computing is best suited for devices that generate or use data at the network edge, such as IoT devices, for analytical or operational purposes. Meanwhile, fog computing is ideal for more complex applications that require real-time computing or processing, such as smart transportation systems or mission-critical applications.

In the end, whether Edge computing or Fog computing is better depends on the specific requirements of the application. Organizations should consider their requirements and choose the model that best suits their needs.

© 2023 Flare Compare